17 research outputs found

    Feed-forward and visual feedback control of head roll orientation in wasps (Polistes humilis, Vespidae, Hymenoptera)

    Flying insects keep their visual system horizontally aligned, suggesting that gaze stabilization is a crucial first step in flight control. Unlike flies, hymenopteran insects such as bees and wasps do not have halteres that provide fast, feed-forward ang…

    Visual servo system based on a biologically inspired scanning sensor

    In the framework of our biologically inspired robotics approach, we describe a visually guided demonstration model aircraft whose attitude is stabilized in yaw by means of a novel, non-emissive optical sensor with a small visual field. This aircraft incorporates a miniature scanning sensor consisting of two photoreceptors with adjacent visual axes driving a Local Motion Detector (LMD), which are made to perform low-amplitude scanning at a varying angular speed. Under these conditions, the signal output from the motion detector varies gradually with the angular position of a contrasting object placed in its visual field, effectively making the complete system a non-emissive optical "position sensor". Remarkably, its output (i) varies quasi-linearly with the angular position of the contrasting object, and (ii) remains largely invariant with respect to the distance to the object and its degree of contrast. We built a miniature, twin-engine, twin-propeller aircraft (weight 100 grammes) equipped with this visual position sensor. After incorporating the sensor into a visuomotor feedback loop enhanced by an inertial sensor, we established that the "sighted" aircraft can fixate and track a dark edge placed in its visual field, thus opening the way for the development of visually guided systems for controlling the attitude of micro-air vehicles, of the kind observed in insects such as hoverflies.
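    The closed loop described above can be sketched in a few lines: a position sensor whose output is quasi-linear in the bearing of a contrasting edge feeds a proportional term, while a rate term stands in for the inertial sensor. All names, gains, and the ±10° field are illustrative assumptions, not values from the paper.

    ```python
    # Hedged sketch of the visuomotor loop: a quasi-linear, contrast- and
    # distance-invariant optical position sensor drives yaw via a PD law
    # (the derivative term playing the role of the inertial sensor).

    def position_sensor(target_bearing_deg: float) -> float:
        """Quasi-linear bearing estimate, clipped to a small visual field."""
        FIELD = 10.0  # assumed +/- 10 deg field of view
        return max(-FIELD, min(FIELD, target_bearing_deg))

    def yaw_controller(bearing_deg: float, yaw_rate_dps: float,
                       kp: float = 0.8, kd: float = 0.2) -> float:
        """PD command: visual position error plus inertial rate damping."""
        return kp * bearing_deg - kd * yaw_rate_dps

    # Toy closed-loop fixation on an edge initially 8 deg off-axis
    yaw, rate, target, dt = 0.0, 0.0, 8.0, 0.01
    for _ in range(1000):
        err = position_sensor(target - yaw)
        rate = yaw_controller(err, rate)
        yaw += rate * dt
    # yaw converges toward the 8 deg target bearing
    ```

    The rate-damping term is what lets the loop fixate without sustained oscillation, mirroring the role the abstract assigns to the inertial sensor.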

    LinLED: Low latency and accurate contactless gesture interaction

    An innovative gesture interface called LinLED is described. Traditional interfaces often suffer from limited tracking range and significant latency when locating fingers or hands; LinLED presents a game-changing solution. LinLED comprises a 1D array of photodiodes and offers great precision in locating hands or fingers moving laterally in the air. Its accuracy is as small as 1 mm, ten times smaller than the photodiode pitch. Moreover, its latency is as low as 1 ms, thanks to its purely analog processing. LinLED can accurately detect any object reflecting infrared within a range of approximately 30 cm by 40 cm. Additionally, LinLED is sensitive to vertical (Z-axis) motion, enabling the detection of three different types of gesture: selection, swipe, and graded smooth lateral movement. LinLED represents a significant advancement in gesture interfaces, offering high precision, minimal latency, and the ability to detect various gestures effectively.
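    A position accuracy ten times finer than the diode pitch is achievable because the reflection profile spreads over several diodes, so their responses can be interpolated. The sketch below illustrates the principle with an intensity-weighted centroid; the Gaussian response model, the 10 mm pitch, and the 16-diode array are assumptions for illustration, and the actual device performs this combination in analog circuitry.

    ```python
    # Hedged sketch: sub-pitch localization from a 1D photodiode array
    # via an intensity-weighted centroid of the per-diode responses.
    import math

    PITCH_MM = 10.0
    POSITIONS = [i * PITCH_MM for i in range(16)]  # diode centres, 0..150 mm

    def diode_responses(finger_mm: float, spread_mm: float = 12.0):
        """Assumed smooth infrared reflection profile seen by each diode."""
        return [math.exp(-((p - finger_mm) / spread_mm) ** 2) for p in POSITIONS]

    def centroid_mm(responses):
        """Intensity-weighted centroid: position estimate below the pitch."""
        return sum(p * r for p, r in zip(POSITIONS, responses)) / sum(responses)

    est = centroid_mm(diode_responses(73.4))  # recovers ~73.4 mm
    ```

    Because the profile is wider than the pitch, the centroid varies smoothly with finger position, which is what makes millimetre accuracy possible from centimetre-spaced diodes.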

    Toward optic flow regulation for wall-following and centring behaviours

    In our ongoing project on the autonomous guidance of Micro-Air Vehicles (MAVs) in confined indoor and outdoor environments, we have developed a bio-inspired, optic-flow-based autopilot enabling a hovercraft to travel safely and avoid the walls of a corridor. The hovercraft is an air vehicle endowed with natural roll and pitch stabilization characteristics, on which planar flight control can be developed conveniently. It travels at a constant ground height (~2 mm) and senses the environment by means of two lateral eyes that measure the right and left optic flows (OFs). The visuomotor feedback loop, called LORA(1) (Lateral Optic flow Regulation Autopilot, Mark 1), consists of a lateral OF regulator that adjusts the hovercraft's yaw velocity so as to keep the lateral OF on one wall constant, equal to an OF set-point. Simulations have shown that the hovercraft manages to navigate in a corridor at a pre-set groundspeed (1 m/s) without requiring a supervisor to switch abruptly between the control laws corresponding to behaviours such as automatic wall-following, automatic centring, and automatic reaction to an opening encountered in a wall. The passive visual sensors and the simple control system used here are suitable for use on MAVs with an avionic payload of only a few grams.
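    The regulation principle is compact: at ground speed v, a wall at lateral distance d generates a translational OF of v/d, so holding the nearer wall's OF at a set-point fixes the clearance at v/set-point. The sketch below shows this in one dimension; the corridor width, set-point, and gain are assumptions for illustration, not the paper's values.

    ```python
    # Hedged sketch of the LORA(1) principle: regulate the larger of the
    # two lateral optic flows to a set-point, which yields wall-following
    # at a fixed clearance of V / OF_SET from the nearer wall.

    V = 1.0         # pre-set ground speed, m/s (from the abstract)
    CORRIDOR = 1.0  # corridor width, m (assumed)
    OF_SET = 4.0    # lateral OF set-point, rad/s (assumed)

    def lateral_ofs(y: float):
        """Left/right translational OF (rad/s) at distance y from the left wall."""
        return V / y, V / (CORRIDOR - y)

    def lora_step(y: float, k: float = 0.05) -> float:
        """One regulator step: hold the nearer wall's OF at the set-point."""
        of_l, of_r = lateral_ofs(y)
        if of_l >= of_r:                  # left wall is nearer
            y += k * (of_l - OF_SET)      # excess OF -> steer away from it
        else:
            y -= k * (of_r - OF_SET)
        return y

    y = 0.1  # start 10 cm from the left wall
    for _ in range(500):
        y = lora_step(y)
    # y settles at V / OF_SET = 0.25 m from the left wall
    ```

    No behaviour switching is programmed in: wall-following, centring, or moving away from an opening all emerge from the same unilateral OF regulation, which is the point the abstract makes about avoiding a supervisor.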

    Behavioural evidence for a visual and proprioceptive control of head roll in hoverflies (Episyrphus balteatus)

    The ability of hoverflies to control their head orientation with respect to their body contributes significantly to their agility and autonomous navigation abilities. Many tasks performed by this insect during flight, especially while hovering, involve a head stabilization reflex. This reflex, mediated by multisensory channels, prevents visual processing from being disturbed by motion blur and maintains a consistent perception of the visual environment. The so-called dorsal light response (DLR) is another head control reflex, which makes insects sensitive to the brightest part of the visual field. In this study, we experimentally validate and quantify the control loop driving the head roll with respect to the horizon in hoverflies. The new approach developed here consists of using an upside-down horizon in a body-roll paradigm. In this unusual configuration, tethered flying hoverflies surprisingly no longer rely on purely vision-based control for head stabilization. These results shed new light on the role of neck proprioceptor organs in head and body stabilization with respect to the horizon. Based on the responses obtained with male and female hoverflies, an improved model was developed in which the output signals delivered by the neck proprioceptor organs are combined with the visual error in the estimated position of the body roll. An internal estimate of the body-roll angle with respect to the horizon might explain the extremely accurate flight performance achieved by some hovering insects.
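    The proposed fusion can be sketched as a simple loop: the neck proprioceptors report the head-to-body angle, the visual error (retinal horizon tilt) refines an internal estimate of body roll, and the neck command counter-rotates the head by that estimate. The first-order dynamics and gain below are illustrative assumptions, not the paper's fitted model.

    ```python
    # Hedged sketch of a fused head-roll stabilizer: a vision-updated
    # internal estimate of body roll drives the neck so the head stays
    # aligned with the horizon.

    def head_roll_loop(body_roll_deg: float, steps: int = 200,
                       k_vis: float = 0.1) -> float:
        """Returns the residual head roll w.r.t. the horizon (deg)."""
        body_est = 0.0  # internal estimate of body roll vs. horizon
        for _ in range(steps):
            neck = -body_est                   # counter-rotate by the estimate
            head_world = body_roll_deg + neck  # head angle vs. horizon
            body_est += k_vis * head_world     # visual error refines estimate
        return body_roll_deg - body_est        # residual head tilt

    residual = head_roll_loop(30.0)  # residual shrinks toward zero
    ```

    With the visual channel intact, the body-roll estimate converges to the true angle and the head tilt vanishes; removing or inverting the visual term leaves only the proprioceptive pathway, which is the situation probed by the upside-down-horizon experiment.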